-
Abstract: Due to their short timescale, stellar flares are a challenging target for the most modern synoptic sky surveys. The upcoming Vera C. Rubin Legacy Survey of Space and Time (LSST), a project designed to collect more data than any precursor survey, is unlikely to detect flares with more than one data point in its main survey. We developed a methodology to enable LSST studies of stellar flares, with a focus on flare temperature and temperature evolution, which remain poorly constrained compared to flare morphology. By leveraging the sensitivity expected from the Rubin system, differential chromatic refraction (DCR) can be used to constrain flare temperature from a single-epoch detection, which will enable statistical studies of flare temperatures and constrain models of the physical processes behind flare emission using the unprecedentedly high volume of data produced by Rubin over the 10 yr LSST. We model the refraction effect as a function of the atmospheric column density, photometric filter, and temperature of the flare, and show that flare temperatures at or above ∼4000 K can be constrained by a single g-band observation at air mass X ≳ 1.2, given the minimum specified requirement on the single-visit relative astrometric accuracy of LSST, and that a surprisingly large number of LSST observations are in fact likely to be conducted at X ≳ 1.2, in spite of image quality requirements pushing the survey to preferentially low X. Having failed to measure flare DCR in LSST precursor surveys, we make recommendations on survey design and data products that enable these studies in LSST and other future surveys.
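The air-mass dependence described above can be illustrated with a minimal sketch. This is not the paper's model: it only encodes the standard plane-parallel geometry in which the refraction offset scales as tan(z) and air mass X ≈ sec(z), so tan(z) = sqrt(X² − 1). The amplitude `k_arcsec` is a hypothetical per-filter, per-spectrum DCR coefficient, not a value from the abstract.

```python
import math

def dcr_offset_arcsec(airmass: float, k_arcsec: float = 0.045) -> float:
    """Illustrative DCR-induced centroid shift at a given air mass.

    Assumes a plane-parallel atmosphere: offset = k * tan(z),
    with tan(z) = sqrt(X**2 - 1) for air mass X = sec(z).
    `k_arcsec` is a hypothetical amplitude for a given filter and
    source temperature (an assumption, not a measured quantity).
    """
    if airmass < 1.0:
        raise ValueError("air mass must be >= 1")
    tan_z = math.sqrt(airmass**2 - 1.0)
    return k_arcsec * tan_z
```

Under this toy scaling the differential shift vanishes at zenith (X = 1) and grows monotonically with X, which is why observations at X ≳ 1.2 are the regime where the astrometric signal becomes comparable to the single-visit accuracy requirement.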
-
The Legacy Survey of Space and Time (LSST), operated by the Vera C. Rubin Observatory, is a 10-year astronomical survey due to start operations in 2022 that will image half the sky every three nights. LSST will produce ~20 TB of raw data per night, which will be calibrated and analyzed in almost real time. Given the volume of LSST data, the traditional subset-download-process paradigm of data reprocessing faces significant challenges. We describe here the first steps towards a gateway for astronomical science that would enable astronomers to analyze images and catalogs at scale. In this first step, we focus on executing the Rubin LSST Science Pipelines, a collection of image and catalog processing algorithms, on Amazon Web Services (AWS). We describe our initial impressions of the performance, scalability, and cost of deploying such a system in the cloud.
-
Abstract: We present measurements of cosmic shear two-point correlation functions (TPCFs) from Hyper Suprime-Cam Subaru Strategic Program (HSC) first-year data, and derive cosmological constraints based on a blind analysis. The HSC first-year shape catalog is divided into four tomographic redshift bins ranging from $z=0.3$ to 1.5 with equal widths of $\Delta z = 0.3$. The unweighted galaxy number densities in each tomographic bin are 5.9, 5.9, 4.3, and 2.4 arcmin$^{-2}$ from the lowest to highest redshifts, respectively. We adopt the standard TPCF estimators, $\xi_\pm$, for our cosmological analysis, given that we find no evidence of significant B-mode shear. The TPCFs are detected at high significance for all 10 combinations of auto- and cross-tomographic bins over a wide angular range, yielding a total signal-to-noise ratio of 19 in the angular ranges adopted in the cosmological analysis, $7^{\prime} < \theta < 56^{\prime}$ for $\xi_+$ and $28^{\prime} < \theta < 178^{\prime}$ for $\xi_-$. We perform the standard Bayesian likelihood analysis for cosmological inference from the measured cosmic shear TPCFs, including contributions from intrinsic alignment of galaxies as well as systematic effects from PSF model errors, shear calibration uncertainty, and source redshift distribution errors. We adopt a covariance matrix derived from realistic mock catalogs constructed from full-sky gravitational lensing simulations that fully account for survey geometry and measurement noise. For a flat $\Lambda$ cold dark matter model, we find $S_8 \equiv \sigma_8\sqrt{\Omega_{\rm m}/0.3} = 0.804_{-0.029}^{+0.032}$ and $\Omega_{\rm m} = 0.346_{-0.100}^{+0.052}$. We carefully check the robustness of the cosmological results against astrophysical modeling uncertainties and systematic uncertainties in measurements, and find that none of them has a significant impact on the cosmological constraints.
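The $S_8$ parameter quoted in the abstract is a simple combination of $\sigma_8$ and $\Omega_{\rm m}$. A one-line sketch of that definition (inputs below are illustrative, not values re-derived from the HSC analysis):

```python
import math

def s8(sigma8: float, omega_m: float) -> float:
    """S_8 = sigma_8 * sqrt(Omega_m / 0.3), the lensing amplitude
    combination constrained by cosmic shear surveys."""
    return sigma8 * math.sqrt(omega_m / 0.3)

# For Omega_m = 0.3 the square-root prefactor is unity, so S_8 equals
# sigma_8; larger Omega_m boosts S_8 for the same sigma_8.
```

This combination is used because cosmic shear is most sensitive along the $\sigma_8$-$\Omega_{\rm m}$ degeneracy direction, so $S_8$ is the best-constrained quantity.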